# Efficient Language Models
## OpenELM 3B
OpenELM is a family of open-source efficient language models that use a layer-wise scaling strategy to allocate parameters more efficiently within each layer of the transformer, improving accuracy. It is released at four parameter scales: 270M, 450M, 1.1B, and 3B, each with both pre-trained and instruction-tuned variants. A sketch of the layer-wise allocation idea follows this entry.
Tags: Large Language Model, Transformers
Publisher: apple
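
The layer-wise scaling mentioned above varies each transformer layer's width instead of keeping it uniform: earlier layers get fewer attention heads and a narrower feed-forward block, later layers get more. The sketch below only illustrates the idea; the function name, interpolation ranges, and defaults are assumptions for illustration, not OpenELM's actual configuration code.

```python
# Illustrative layer-wise scaling: interpolate per-layer attention-head
# counts and FFN widths from a small value at the first layer to a larger
# value at the last layer. All names and ranges here are assumed for the
# sketch, not taken from the OpenELM source.

def layer_wise_scaling(num_layers, model_dim, head_dim=64,
                       alpha_min=0.5, alpha_max=1.0,   # attention width range (assumed)
                       beta_min=0.5, beta_max=4.0):    # FFN multiplier range (assumed)
    configs = []
    for i in range(num_layers):
        t = i / max(num_layers - 1, 1)                 # 0.0 at the first layer, 1.0 at the last
        alpha = alpha_min + t * (alpha_max - alpha_min)
        beta = beta_min + t * (beta_max - beta_min)
        configs.append({
            "layer": i,
            "num_heads": max(1, round(alpha * model_dim / head_dim)),
            "ffn_dim": int(beta * model_dim),
        })
    return configs

# Example: per-layer widths for a toy 8-layer model with model dimension 1280.
for cfg in layer_wise_scaling(num_layers=8, model_dim=1280):
    print(cfg)
```

The point of the scheme is that, for a roughly comparable overall parameter budget, capacity is shifted toward the layers where it helps accuracy most rather than being spread uniformly.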
## OpenELM 270M
OpenELM is a family of open-source efficient language models that adopt a layer-wise scaling strategy to allocate parameters efficiently within each layer of the transformer model, improving accuracy. A loading sketch follows this entry.
Tags: Large Language Model, Transformers
Publisher: apple
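
Both checkpoints above carry the Transformers tag, so a minimal loading sketch may be useful. It assumes the checkpoints are published on the Hugging Face Hub under the repo IDs shown, that OpenELM's custom modeling code is enabled via trust_remote_code=True, and that the Llama 2 tokenizer (a gated repository) is paired with the model as the public model cards describe; adjust these details if the actual cards differ.

```python
# Minimal sketch of loading an OpenELM checkpoint with Hugging Face
# transformers. Repo IDs and the tokenizer pairing are assumptions based
# on the public model cards, not guaranteed by this listing.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "apple/OpenELM-270M"            # assumed Hub repo ID
tokenizer_id = "meta-llama/Llama-2-7b-hf"  # OpenELM is documented as reusing the Llama 2 tokenizer (gated)

tokenizer = AutoTokenizer.from_pretrained(tokenizer_id)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

inputs = tokenizer("Efficient language models are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```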